Combining Neural Network Voting Classifiers and Error Correcting Output Codes
Author
Abstract
We show that error correcting output codes (ECOC) can further improve the effects of error-dependent adaptive resampling methods such as arc-lh. In traditional one-in-n coding, the distance between two binary class labels is rather small, whereas ECOC are chosen to maximize this distance. We compare one-in-n and ECOC on a multiclass data set using standard MLPs and bagging and arcing voting committees.
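The distance argument in the abstract can be made concrete with a short sketch: under one-in-n (one-hot) coding, any two codewords differ in exactly two bit positions, while an ECOC matrix is built so that codewords are far apart in Hamming distance. The 7-bit code below is an illustrative example, not the code used in the paper.

```python
from itertools import combinations

def min_hamming(codewords):
    """Minimum pairwise Hamming distance between class codewords."""
    return min(
        sum(a != b for a, b in zip(u, v))
        for u, v in combinations(codewords, 2)
    )

# One-in-n (one-hot) coding for 4 classes: every pair of codewords
# differs in exactly 2 positions.
one_in_n = [
    (1, 0, 0, 0),
    (0, 1, 0, 0),
    (0, 0, 1, 0),
    (0, 0, 0, 1),
]

# A 7-bit error correcting output code for 4 classes (illustrative):
# every pair of rows differs in 4 positions.
ecoc = [
    (1, 1, 1, 1, 1, 1, 1),
    (0, 0, 0, 0, 1, 1, 1),
    (0, 0, 1, 1, 0, 0, 1),
    (0, 1, 0, 1, 0, 1, 0),
]

print(min_hamming(one_in_n))  # 2
print(min_hamming(ecoc))      # 4
```

A minimum distance d lets nearest-codeword decoding correct up to floor((d - 1) / 2) wrong bit predictions, which is why larger codeword separation can help a committee of binary learners.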
Similar references
Combining Neural Network Voting Classifiers and Error Correcting Output Codes
Papers published in this report series are preliminary versions of journal articles and not for quotation. Abstract: We show that error correcting output codes (ECOC) can further improve the effects of error-dependent adaptive resampling methods such as arc-lh. In traditional one-in-n coding, the distance between two binary class labels is rather small, whereas ECOC are chosen to maximize this di...
Full text

Combining Nearest Neighbor Classifiers Through Multiple Feature Subsets
Combining multiple classifiers is an effective technique for improving accuracy. There are many general combining algorithms, such as Bagging or Error Correcting Output Coding, that significantly improve classifiers like decision trees, rule learners, or neural networks. Unfortunately, many combining methods do not improve the nearest neighbor classifier. In this paper, we present MFS, a combining a...
Full text

Extending Local Learners with Error-correcting Output Codes
Error-correcting output codes (ECOCs) represent classes with a set of output bits, where each bit encodes a binary classification task corresponding to a unique partition of the classes. Algorithms that use ECOCs learn the function corresponding to each bit, and combine them to generate class predictions. ECOCs can reduce both variance and bias errors for multiclass classification tasks when the ...
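The combining step described above amounts to nearest-codeword decoding: collect the bit predictions from the binary learners and return the class whose codeword is closest in Hamming distance. A minimal sketch, using a hypothetical 7-bit code for 4 classes (not a matrix from any of the papers listed here):

```python
def hamming(u, v):
    """Number of positions where two bit strings disagree."""
    return sum(a != b for a, b in zip(u, v))

def ecoc_decode(bit_predictions, codewords):
    """Index of the class whose codeword is nearest (in Hamming
    distance) to the predicted bit string."""
    return min(range(len(codewords)),
               key=lambda c: hamming(bit_predictions, codewords[c]))

# Illustrative 7-bit codewords for 4 classes (minimum pairwise distance 4).
codewords = [
    (1, 1, 1, 1, 1, 1, 1),
    (0, 0, 0, 0, 1, 1, 1),
    (0, 0, 1, 1, 0, 0, 1),
    (0, 1, 0, 1, 0, 1, 0),
]

# Even if one of the seven binary learners errs (second bit flipped
# relative to class 1's codeword), decoding still recovers class 1.
print(ecoc_decode((0, 1, 0, 0, 1, 1, 1), codewords))  # 1
```

This is where the variance reduction comes from: individual bit errors are absorbed as long as fewer than half the minimum codeword distance are wrong.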
Full text

Nearest neighbor classification from multiple feature subsets
Combining multiple classifiers is an effective technique for improving accuracy. There are many general combining algorithms, such as Bagging, Boosting, or Error Correcting Output Coding, that significantly improve classifiers like decision trees, rule learners, or neural networks. Unfortunately, these combining methods do not improve the nearest neighbor classifier. In this paper, we present MFS, a...
Full text

Applying Multiple Complementary Neural Networks to Solve Multiclass Classification Problem
In this paper, a multiclass classification problem is solved using multiple complementary neural networks. Two techniques are applied to multiple complementary neural networks: one-against-all and error correcting output codes. We evaluate our proposed techniques using an extremely imbalanced data set named glass from the UCI machine learning repository. It is found that the combinati...
Full text